Nonlinear Blind Source Separation Using Kernel Feature Spaces
Authors
Abstract
In this work we propose a kernel-based blind source separation (BSS) algorithm that can perform nonlinear BSS for general invertible nonlinearities. Our kTDSEP algorithm proceeds in four steps: (i) adapting to the intrinsic dimension of the data mapped to feature space F, (ii) finding an orthonormal basis of this submanifold, (iii) mapping the data into the subspace of F spanned by this orthonormal basis, and (iv) applying temporal decorrelation BSS (TDSEP) to the mapped data. After demixing we obtain a number of irrelevant components along with the original sources. To find out which ones are the components of interest, we propose a criterion that allows us to identify the original sources. The excellent performance of kTDSEP is demonstrated in experiments on nonlinearly mixed speech data.
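The four steps above can be sketched in NumPy. This is a hedged, simplified stand-in, not the published algorithm: an RBF kernel PCA plays the role of steps (i)-(iii), and a single-lag AMUSE-style diagonalization replaces TDSEP's joint diagonalization in step (iv). All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def ktdsep_sketch(x, n_components=2, lag=1, gamma=1.0):
    """Simplified sketch of the four kTDSEP steps (illustrative only).

    (i)-(iii): kernel PCA with an RBF kernel stands in for adapting to the
    intrinsic dimension and mapping onto an orthonormal basis in F.
    (iv): a single-lag AMUSE-style diagonalization stands in for TDSEP.
    x has shape (n_channels, n_samples).
    """
    n = x.shape[1]
    # RBF Gram matrix of the observed signals
    sq = np.sum(x ** 2, axis=0)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * x.T @ x))
    # center the mapped data in feature space
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # leading eigenvectors of the centered Gram matrix span the data
    # submanifold; their transposes give whitened coordinates of the data
    w, V = np.linalg.eigh(Kc)
    order = np.argsort(w)[::-1][:n_components]
    z = np.sqrt(n) * V[:, order].T            # (n_components, n_samples)
    # step (iv): diagonalize a symmetrized time-lagged covariance
    C = z[:, :-lag] @ z[:, lag:].T / (n - lag)
    C = 0.5 * (C + C.T)
    _, U = np.linalg.eigh(C)
    return U.T @ z                            # candidate source components
```

In the full method, one would then apply the proposed selection criterion to pick out the rows corresponding to the true sources from the irrelevant components.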
Similar Resources
Fast Independent Component Analysis in Kernel Feature Spaces
It is common practice to apply linear or nonlinear feature extraction methods before classification. Usually linear methods are faster and simpler than nonlinear ones but an idea successfully employed in the nonlinearization of Support Vector Machines permits a simple and effective extension of several statistical methods to their nonlinear counterparts. In this paper we follow this general non...
Full text

Kernel Feature Spaces and Nonlinear Blind Source Separation
In kernel based learning the data is mapped to a kernel feature space of a dimension that corresponds to the number of training data points. In practice, however, the data forms a smaller submanifold in feature space, a fact that has been used e.g. by reduced set techniques for SVMs. We propose a new mathematical construction that permits to adapt to the intrinsic dimension and to find an ortho...
Full text

The Geometry Of Kernel Canonical Correlation Analysis
Canonical correlation analysis (CCA) is a classical multivariate method concerned with describing linear dependencies between sets of variables. After a short exposition of the linear sample CCA problem and its analytical solution, the article proceeds with a detailed characterization of its geometry. Projection operators are used to illustrate the relations between canonical vectors and variat...
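The linear sample CCA problem mentioned above has a compact analytical solution; a minimal sketch (names and the QR-based whitening are illustrative choices, not taken from the article) computes canonical correlations as singular values of the product of the orthonormalized data matrices:

```python
import numpy as np

def sample_cca(X, Y, k=1):
    """Minimal linear sample CCA sketch (illustrative).

    X is (n, p), Y is (n, q). Returns canonical vectors a, b and the
    top-k canonical correlations.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    # QR decompositions give orthonormal bases of the column spaces,
    # i.e. whitened versions of the centered data
    Qx, Rx = np.linalg.qr(Xc)
    Qy, Ry = np.linalg.qr(Yc)
    # canonical correlations are singular values of Qx.T @ Qy
    U, s, Vt = np.linalg.svd(Qx.T @ Qy)
    a = np.linalg.solve(Rx, U[:, :k])   # canonical vectors for X
    b = np.linalg.solve(Ry, Vt.T[:, :k])
    return a, b, s[:k]
```

Because both Qx and Qy have orthonormal columns, each singular value lies in [0, 1], matching the interpretation of canonical correlations as cosines of angles between the two subspaces.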
Full text

Using Kernel Density Estimator in Nonlinear Mixture
Generally, blind separation of sources from their nonlinear mixtures is rather difficult. This nonlinear mapping, constituted by unsupervised linear mixing followed by unknown and invertible nonlinear distortion, is found in many signal processing cases. We propose using a kernel density estimator incorporated within an equivariant gradient algorithm to separate the nonlinear mixed sources. The...
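The kernel density estimator at the heart of that approach can be illustrated in isolation; this minimal Gaussian-kernel sketch (the equivariant gradient algorithm itself is not reproduced, and the bandwidth rule is an assumed default) shows the basic density estimate such methods rely on:

```python
import numpy as np

def gaussian_kde(samples, grid, h=None):
    """Minimal 1-D Gaussian kernel density estimator (illustrative).

    samples: 1-D array of observations; grid: points to evaluate at.
    """
    n = samples.size
    if h is None:
        # Silverman's rule-of-thumb bandwidth (assumed default)
        h = 1.06 * samples.std(ddof=1) * n ** (-1 / 5)
    diffs = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * diffs ** 2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))
```

In a separation algorithm, such an estimate of each output's density (or its derivative, the score function) would drive the gradient updates of the unmixing parameters.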
Full text